#Algorithms #Series
1. Quadratic Regression
Let $\varphi \colon \{(i,j) \in [d]^2 : i \le j\} \to \{1, \dots, d(d+1)/2\}$ be a bijection given by any fixed enumeration of the index pairs, e.g. the lexicographic one. Then, we define $\tilde{x}, \tilde{\beta} \in \mathbb{R}^{d(d+1)/2}$ to be the vectorized counterparts of $xx^\top$ and of the symmetric parameter matrix $B$, i.e.
$$\tilde{x}_{\varphi(i,j)} = x_i x_j, \qquad \tilde{\beta}_{\varphi(i,j)} = \begin{cases} B_{ii}, & i = j,\\ 2B_{ij}, & i < j. \end{cases}$$
Then, by defining the lifted design matrix $\tilde{X} \in \mathbb{R}^{n \times d(d+1)/2}$ with rows $\tilde{x}_k^\top$, we get that the quadratic model becomes linear:
$$y_k = x_k^\top B\, x_k + \varepsilon_k = \langle \tilde{\beta}, \tilde{x}_k \rangle + \varepsilon_k, \qquad k = 1, \dots, n.$$
Let $\hat{\beta}$ then be the least squares estimator for $\tilde{\beta}$ with
$$\hat{\beta} \in \arg\min_{\beta} \|Y - \tilde{X}\beta\|_2^2,$$
which is of course polynomial-time computable, as the lifted dimension $d(d+1)/2$ is polynomial in $d$. Let $\hat{B}$ be the symmetric matrix obtained from $\hat{\beta}$ by inverting the vectorization. Then, it follows that $\hat{B} \to B$ in probability as $n \to \infty$.
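A minimal NumPy sketch of this reduction, assuming the model $y = x^\top B x + \varepsilon$ with a hypothetical symmetric $B$; dimensions, noise level, and variable names are illustrative, not from the original:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 5, 2000

# Hypothetical ground-truth symmetric parameter matrix B.
B = rng.standard_normal((d, d))
B = (B + B.T) / 2

# The bijection phi: enumerate index pairs (i, j) with i <= j.
pairs = [(i, j) for i in range(d) for j in range(i, d)]

def lift(x):
    """Vectorized counterpart of x x^T: entry phi(i, j) holds x_i * x_j."""
    return np.array([x[i] * x[j] for (i, j) in pairs])

# Vectorized counterpart of B: B_ii on diagonal pairs, 2 * B_ij off them,
# so that <beta, lift(x)> = x^T B x.
beta = np.array([B[i, j] if i == j else 2 * B[i, j] for (i, j) in pairs])

# Data from the quadratic model y = x^T B x + noise.
X = rng.standard_normal((n, d))
y = np.einsum("ki,ij,kj->k", X, B, X) + 0.1 * rng.standard_normal(n)

# Ordinary least squares on the lifted design -- polynomial-time.
X_tilde = np.stack([lift(x) for x in X])
beta_hat, *_ = np.linalg.lstsq(X_tilde, y, rcond=None)

print("||beta_hat - beta|| =", np.linalg.norm(beta_hat - beta))  # small for large n
```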
2. Nuclear and Frobenius Norm for Low-Rank Matrices
Let $A \in \mathbb{R}^{d_1 \times d_2}$ with $\operatorname{rank}(A) = r$ and non-zero singular values $\sigma_1 \ge \dots \ge \sigma_r > 0$. Then, by Cauchy-Schwarz,
$$\|A\|_* = \sum_{i=1}^{r} \sigma_i \cdot 1 \le \left(\sum_{i=1}^{r} \sigma_i^2\right)^{1/2} \left(\sum_{i=1}^{r} 1\right)^{1/2} = \sqrt{r}\, \|A\|_F.$$
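A quick numerical sanity check of $\|A\|_* \le \sqrt{r}\,\|A\|_F$; the matrix sizes and the thin-factor construction of a rank-$r$ matrix are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
d1, d2, r = 40, 30, 5

# Random rank-r matrix as a product of thin factors.
A = rng.standard_normal((d1, r)) @ rng.standard_normal((r, d2))

nuclear = np.linalg.norm(A, ord="nuc")    # sum of singular values
frobenius = np.linalg.norm(A, ord="fro")  # sqrt of sum of squared singular values

assert nuclear <= np.sqrt(r) * frobenius + 1e-9
print(f"||A||_* = {nuclear:.3f} <= sqrt(r) * ||A||_F = {np.sqrt(r) * frobenius:.3f}")
```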
3. Second Moment of Random Unit Vectors
Let $R$ be an $n \times n$ rotation matrix. Then, if $x \sim \operatorname{Unif}(S^{n-1})$, where $S^{n-1} \subset \mathbb{R}^n$ is the unit sphere, we get that
$$R\,\mathbb{E}[xx^\top]\,R^\top = \mathbb{E}[(Rx)(Rx)^\top] = \mathbb{E}[xx^\top],$$
since $Rx$ is again uniform on $S^{n-1}$, i.e. $\Sigma := \mathbb{E}[xx^\top]$ commutes with all rotation matrices.
We claim that $\Sigma$ has to be a scalar multiple of the identity. Consider the inclusion $\rho \colon SO(n) \hookrightarrow GL_n(\mathbb{R})$. Then, this is a unitary representation by definition. We now show that $\rho$ is irreducible.
Let $V \subseteq \mathbb{R}^n$ be a non-zero invariant vector subspace. We claim that $V = \mathbb{R}^n$. Assume by contradiction that there exists $w \in \mathbb{R}^n \setminus V$, w.l.o.g. with $\|w\|_2 = 1$. As $V$ is non-zero there exists $v \in V$ s.t. $\|v\|_2 = 1$, and we can indeed find a rotation matrix $R$ s.t. $Rv = w$, as $SO(n)$ acts transitively on $S^{n-1}$. This is a contradiction to $V$ being invariant, and therefore $V = \mathbb{R}^n$.
This shows that $\rho$ is irreducible and, by Schur's lemma, $\Sigma = \lambda I_n$ for some $\lambda \in \mathbb{R}$. Finally,
$$n\lambda = \operatorname{tr}(\Sigma) = \mathbb{E}[\operatorname{tr}(xx^\top)] = \mathbb{E}[\|x\|_2^2] = 1,$$
as trace and expectation commute, so $\lambda = \tfrac{1}{n}$ and $\mathbb{E}[xx^\top] = \tfrac{1}{n} I_n$. This proves the statement.
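The conclusion $\mathbb{E}[xx^\top] = \tfrac{1}{n} I_n$ can be verified by Monte Carlo, sampling uniform unit vectors as normalized Gaussians (a standard construction; the dimension and sample size below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
n_dim, n_samples = 4, 200_000

# Uniform points on S^{n-1}: normalize i.i.d. standard Gaussian vectors.
g = rng.standard_normal((n_samples, n_dim))
x = g / np.linalg.norm(g, axis=1, keepdims=True)

# Empirical second moment E[x x^T]; should approach I / n_dim.
second_moment = x.T @ x / n_samples

print(np.round(second_moment, 3))  # approximately 0.25 * I for n_dim = 4
```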
4. High-Probability Prediction Error Bound
Consider the linear model $Y = X\beta^* + \varepsilon$ with fixed design $X \in \mathbb{R}^{n \times d}$, noise $\varepsilon \sim \mathcal{N}(0, \sigma^2 I_n)$, and least squares estimator $\hat{\beta}$. Writing $\Pi$ for the orthogonal projection onto the column space of $X$, $r := \operatorname{rank}(X)$, and $u_1, \dots, u_r$ for an orthonormal basis of that column space, we have $X(\hat{\beta} - \beta^*) = \Pi\varepsilon$ and hence
$$\|X(\hat{\beta} - \beta^*)\|_2^2 = \|\Pi\varepsilon\|_2^2 = \sum_{i=1}^r \langle u_i, \varepsilon \rangle^2.$$
We have:
1. As $\varepsilon \sim \mathcal{N}(0, \sigma^2 I_n)$, we have that $\langle u_i, \varepsilon \rangle$ is Gaussian with:
$$\mathbb{E}[\langle u_i, \varepsilon \rangle] = 0, \qquad \operatorname{Var}(\langle u_i, \varepsilon \rangle) = \sigma^2 \|u_i\|_2^2 = \sigma^2.$$
2. As $\langle u_1, \varepsilon \rangle, \dots, \langle u_r, \varepsilon \rangle$ are jointly Gaussian and uncorrelated, they are independent, and so are $\langle u_1, \varepsilon \rangle^2, \dots, \langle u_r, \varepsilon \rangle^2$. Now, $\langle u_i, \varepsilon \rangle^2 / \sigma^2 \sim \chi_1^2$, and using the fact that $\chi_1^2$ random variables are $(2,4)$-sub-exponential, we get the following: First,
$$\mathbb{E}\left[\exp\left(\lambda\left(\tfrac{\langle u_i, \varepsilon \rangle^2}{\sigma^2} - 1\right)\right)\right] \le e^{2\lambda^2} \quad \text{for all } |\lambda| \le \tfrac{1}{4}.$$
Hence, for all $|\lambda| \le \tfrac{1}{4}$,
$$\mathbb{E}\left[\exp\left(\lambda \sum_{i=1}^r \left(\tfrac{\langle u_i, \varepsilon \rangle^2}{\sigma^2} - 1\right)\right)\right] \le e^{2r\lambda^2}.$$
From the sum tail bound, we therefore acquire for $t \ge 0$,
$$\mathbb{P}\left(\sum_{i=1}^r \left(\tfrac{\langle u_i, \varepsilon \rangle^2}{\sigma^2} - 1\right) \ge t\right) \le \exp\left(-\min\left(\tfrac{t^2}{8r}, \tfrac{t}{8}\right)\right),$$
where the minimum distinguishes two regimes:
   - if $0 \le t \le r$: the Gaussian-type regime applies and the bound is $\exp(-t^2/(8r))$;
   - if $t > r$: the exponential regime applies and the bound is $\exp(-t/8)$.
3. From 2., with probability at least $1 - \delta$, we have that:
$$\sum_{i=1}^r \langle u_i, \varepsilon \rangle^2 \le \sigma^2 (r + t) \quad \text{for } t = \sqrt{8r \log(1/\delta)} + 8 \log(1/\delta).$$
Hence, with the same probability,
$$\frac{1}{n}\|X(\hat{\beta} - \beta^*)\|_2^2 \le \frac{\sigma^2}{n}\left(r + \sqrt{8r \log(1/\delta)} + 8 \log(1/\delta)\right),$$
as checked numerically in the sketch after this list.
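A small simulation sketch of the bound, under the Gaussian fixed-design setup assumed above; the design matrix, $\sigma$, $\delta$, and trial count are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(3)
n, d, sigma, delta = 500, 20, 1.0, 0.01
n_trials = 2000

X = rng.standard_normal((n, d))  # fixed design, kept the same across trials
beta_star = rng.standard_normal(d)
r = np.linalg.matrix_rank(X)

# High-probability bound on (1/n) * ||X (beta_hat - beta_star)||^2.
bound = (sigma**2 / n) * (r + np.sqrt(8 * r * np.log(1 / delta)) + 8 * np.log(1 / delta))

errors = np.empty(n_trials)
for k in range(n_trials):
    y = X @ beta_star + sigma * rng.standard_normal(n)
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    errors[k] = np.sum((X @ (beta_hat - beta_star)) ** 2) / n

print(f"bound: {bound:.4f}")
print(f"fraction of trials exceeding the bound: {(errors > bound).mean():.4f}")
# The exceedance fraction should stay well below delta = 0.01.
```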